
    Enhancement of Metaheuristic Algorithm for Scheduling Workflows in Multi-fog Environments

    Whether in computer science, engineering, or economics, optimization lies at the heart of any challenge involving decision-making. Decision-making means choosing among several alternatives, driven by the desire to make the "better" choice, where the goodness of each alternative is assessed by an objective function or performance index. The theory and methods of optimization are concerned with picking the best option. Optimization methods fall into two types: deterministic and stochastic. The first is the traditional approach, which works well for small, linear problems, but struggles with most real-world problems, which are high-dimensional, nonlinear, and complex in nature. Stochastic optimization algorithms, by contrast, are designed specifically to tackle such challenges and are more common nowadays. This study proposed two robust, stochastic, swarm-based metaheuristic optimization methods. Both are hybrid algorithms, formulated by combining the Particle Swarm Optimization and Salp Swarm Optimization algorithms. These algorithms were then applied to an important and thought-provoking problem: scientific workflow scheduling in multiple fog environments. Many computing environments, fog computing among them, are plagued by security attacks that must be handled. DDoS attacks are particularly harmful to fog computing environments because they occupy the fog's resources and keep them busy. A fog environment therefore generally has fewer resources available during such attacks, which in turn affects the scheduling of submitted Internet of Things (IoT) workflows. Nevertheless, current systems disregard the impact of DDoS attacks in their scheduling process, increasing both the number of workflows that miss their deadlines and the number of tasks that are offloaded to the cloud.
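    Of the two building blocks named above, PSO is the simpler to sketch. The following minimal, self-contained illustration runs PSO on a toy sphere objective; it is not the authors' implementation, and all parameter values (inertia weight, acceleration coefficients, swarm size) are assumptions chosen only to make the sketch converge:

    ```python
    import random

    def pso_minimize(f, dim=2, n_particles=20, iters=200, seed=1):
        """Minimal PSO: each particle tracks its personal best, the swarm
        tracks a global best, and velocities blend inertia, cognitive,
        and social terms."""
        rng = random.Random(seed)
        w, c1, c2 = 0.7, 1.5, 1.5   # assumed inertia / acceleration coefficients
        pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Sphere function: global minimum 0 at the origin.
    best, best_val = pso_minimize(lambda x: sum(v * v for v in x))
    ```

    The hybrid algorithms of the study additionally fold in the salp-chain update of SSA, which is omitted here.
    
    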
Hence, this study proposed a hybrid optimization algorithm as a solution to the workflow scheduling problem across multiple fog computing locations. The proposed algorithm combines the Salp Swarm Algorithm (SSA) and Particle Swarm Optimization (PSO). To deal with the effects of DDoS attacks on fog computing locations, two discrete-time Markov-chain schemes were used: one calculates the average network bandwidth available in each fog, while the other determines the average number of virtual machines available in each fog. DDoS attacks are addressed at various levels, and the approach predicts a DDoS attack's influence on fog environments. Based on the simulation results, the proposed method can significantly reduce the number of offloaded tasks transferred to cloud data centers, and it can also decrease the number of workflows with missed deadlines. Moreover, green fog computing is growing in significance, since energy consumption plays an essential role in determining maintenance expenses and carbon dioxide emissions. Efficient scheduling can mitigate energy usage by allocating tasks to the most appropriate resources, considering the energy efficiency of each individual resource. To this end, the proposed algorithm integrates the Dynamic Voltage and Frequency Scaling (DVFS) technique, which is commonly employed to enhance the energy efficiency of processors. The experimental findings demonstrate that the proposed method, combined with DVFS, yields improved outcomes, including a reduction in energy consumption. Consequently, this approach emerges as a more environmentally friendly and sustainable solution for fog computing environments.
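The discrete-time Markov-chain idea can be sketched as follows. The states, transition matrix, and VM counts below are hypothetical stand-ins, not values from the study; the sketch only shows how a chain's stationary distribution yields the average number of free virtual machines in a fog under intermittent DDoS attacks:

```python
def stationary(P, iters=500):
    """Stationary distribution of a discrete-time Markov chain by
    repeatedly multiplying a row vector with the transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 3-state DDoS model for one fog site:
# state 0 = no attack, 1 = moderate attack, 2 = severe attack.
P = [[0.90, 0.08, 0.02],
     [0.30, 0.60, 0.10],
     [0.20, 0.30, 0.50]]
free_vms = [40, 25, 10]   # assumed VMs left free in each state

pi = stationary(P)
expected_vms = sum(p * v for p, v in zip(pi, free_vms))
```

The same construction with per-state bandwidth figures instead of VM counts gives the average available network bandwidth; a scheduler can then plan against these expectations rather than against the attack-free capacities.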

    Spectrophotometric Determination of Cerium in Some Ore in Kurdistan Region – Iraq

    A simple and sensitive spectrophotometric method was developed for the trace determination of cerium (IV) in serpentinite rocks from two different locations in the Kurdistan region of Iraq. The method is based on measuring the absorbance of the red complex (Ce – sulphanilic acid) at 490 nm and pH = 4.75. Reproducible results were obtained (recovery 98–103%) for both ores and synthetic samples of cerium at trace levels. Keywords: red complex, serpentinite rocks, spectrophotometric, cerium
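    Absorbance measurements of this kind rest on the Beer-Lambert law, A = εlc, so the cerium concentration follows by dividing the measured absorbance by the molar absorptivity and path length. A minimal sketch (the molar absorptivity and absorbance below are purely illustrative; the abstract gives no values for them):

    ```python
    def cerium_concentration(absorbance, molar_absorptivity, path_cm=1.0):
        """Beer-Lambert law: A = epsilon * l * c, hence c = A / (epsilon * l)."""
        return absorbance / (molar_absorptivity * path_cm)

    # Hypothetical numbers for illustration only: epsilon of the
    # Ce - sulphanilic acid complex is NOT reported in the abstract.
    eps = 1.2e4                              # assumed, L/(mol*cm)
    c = cerium_concentration(0.45, eps)      # mol/L for an assumed A = 0.45 at 490 nm
    ```
    
    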

    Effect of Embedded Length on Laterally Loaded Capacity of Pile Foundation

    In the analysis of soil-pile interaction under lateral load, the embedded length of a pile is an important parameter with a significant influence on its lateral load capacity. This paper investigates the effect of the embedded length of a single pile on its ultimate lateral load capacity. A 30 mm steel rod pile was used, and four embedment lengths were selected (250, 300, 350, and 400 mm), giving embedment ratios (length to diameter, L/D) of 8.3, 10, 11.7, and 13.3, respectively. The pile was embedded in sand at three different relative densities (loose, medium, and dense). The tests were performed using the constant-rate-of-displacement method. The results indicate an increase of up to 247% in the capacity of the laterally loaded pile when the embedment ratio (L/D) increases from 8.3 to 13.3, and an increase of up to 599% in the resistant capacity of the pile when the relative density increases from 12% to 85%
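    The embedment ratios quoted above follow directly from L/D with the 30 mm pile diameter:

    ```python
    diameter_mm = 30
    lengths_mm = [250, 300, 350, 400]
    # Embedment ratio L/D, rounded to one decimal as reported in the paper.
    ratios = [round(L / diameter_mm, 1) for L in lengths_mm]   # [8.3, 10.0, 11.7, 13.3]
    ```
    
    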

    A Coding-Based Steganography Using Multiple Frequency Domains

    Abstract: In this paper, a new technique for hiding text in bitmap images is presented. The technique is based on using dictionary indexes that represent the characters of the secret message instead of the characters themselves. These indexes are embedded in an arbitrarily chosen bitmap image using multiple frequency domains: the discrete cosine transform (DCT), the discrete wavelet transform (DWT), and a combination of both. A software package implementing this technique was built, and very good results were obtained in terms of hiding capacity and imperceptibility, the two most important properties of steganography, as well as the time needed to hide the text and the security issues
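    The dictionary-index idea can be sketched in a few lines. In this toy version the "coefficients" are plain integers standing in for quantized DCT/DWT coefficients, and the embedding is a simple least-significant-bit substitution; the actual technique embeds in real frequency-domain coefficients of the cover bitmap:

    ```python
    # Toy sketch: characters -> dictionary indexes -> bits -> LSBs of
    # integers standing in for quantized frequency-domain coefficients.
    dictionary = sorted(set("hello fog"))            # toy symbol dictionary
    bits_per_index = max(1, (len(dictionary) - 1).bit_length())

    def to_bits(msg):
        out = []
        for ch in msg:
            idx = dictionary.index(ch)               # store index, not the char
            out += [(idx >> b) & 1 for b in reversed(range(bits_per_index))]
        return out

    def embed(coeffs, bits):
        """Overwrite the LSB of one coefficient per message bit."""
        return [(c & ~1) | b for c, b in zip(coeffs, bits)] + coeffs[len(bits):]

    def extract(coeffs, n_chars):
        bits = [c & 1 for c in coeffs[:n_chars * bits_per_index]]
        chars = []
        for i in range(0, len(bits), bits_per_index):
            idx = 0
            for b in bits[i:i + bits_per_index]:
                idx = (idx << 1) | b
            chars.append(dictionary[idx])
        return "".join(chars)

    cover = list(range(100, 200))                    # stand-in coefficients
    stego = embed(cover, to_bits("hello"))
    recovered = extract(stego, 5)                    # -> "hello"
    ```

    Because only indexes into a shared dictionary are embedded, the payload per character shrinks to log2 of the dictionary size, which is where the capacity gain of the scheme comes from.
    
    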

    GROUNDWATER DAMS, GENERAL CHARACTERISTICS AND HISTORICAL DEVELOPMENT

    A groundwater dam is any structure, natural or artificial, that intercepts or obstructs groundwater flow through an aquifer and provides underground water storage. Groundwater dams are suitable water supply structures for regions such as Pakistan, where arid and semi-arid climate conditions dominate. They can be an alternative when traditional surface dams are unsuitable or inapplicable because of complex geological conditions, safety hazards, or silting. However, they cannot be used for recreation or power generation, nor as a universal method of water supply. By storing water in underground dams instead of surface dams, many of the aforesaid problems may be overcome. In addition to their main purpose of providing groundwater storage, underground dams are the most reliable method of preventing saltwater intrusion, a vital problem in coastal areas. Alongside their many advantages, they have several disadvantages as well, such as limited reservoir capacities, higher operating costs, and the need for detailed hydrogeological site investigations and aquifer tests before construction. Groundwater dams are not new engineering structures; they have already been built in several regions around the world, historically as far back as Roman times in Sardinia and the old civilizations of Tunisia in Africa. This paper emphasizes the significance of this type of structure and reviews the general characteristics and historical development of groundwater dams

    Deep Transfer Learning Networks for Brain Tumor Detection: The Effect of MRI Patient Image Augmentation Methods

    The exponential growth of deep learning networks has enabled us to handle difficult tasks, even in the complex field of medicine with its small datasets; in the sphere of treatment they are particularly significant. To identify brain tumors, this research examines how conventional data augmentation methods affect three deep learning networks: MobileNetV2, VGG19, and DenseNet201. The findings showed that the networks were significantly affected by the image augmentation schemes, comparing performance before and after their use. The accuracy of MobileNetV2 improved from 85.33% to 96.88%, that of VGG19 from 77.33% to 95.31%, and that of DenseNet201 from 82.66% to 93.75%, relative accuracy changes of 13.53%, 23.25%, and 23.25%, respectively. Finally, the conclusion showed that applying data augmentation approaches improves performance, producing models far better than those trained from scratch

    Task scheduling mechanisms for fog computing: A systematic survey

    In the Internet of Things (IoT) ecosystem, combining fog computing (FC) with cloud computing allows some processing to be done near the data production sites, at higher speed and without the need for high bandwidth. Fog computing offers advantages for real-time systems that require high-speed connectivity. Due to the limited resources of fog nodes, one of the most important challenges of FC is meeting dynamic needs in real time, so a central issue in the fog environment is the optimal assignment of tasks to fog nodes. An efficient scheduling algorithm should reduce parameters such as cost and energy consumption while taking into account the heterogeneity of fog nodes and the commitment to complete tasks within their deadlines. This study provides a detailed taxonomy to give a better understanding of the research issues and to distinguish important challenges in existing work. It presents a systematic overview of existing task scheduling techniques for cloud-fog environments, along with their benefits and drawbacks. Four main categories are introduced to study these techniques: machine-learning-based, heuristic-based, metaheuristic-based, and deterministic mechanisms, with a number of papers studied in each category. The survey also compares the different task scheduling techniques in terms of execution time, resource utilization, delay, network bandwidth, energy consumption, execution deadline, response time, cost, uncertainty, and complexity. The outcomes revealed that 38% of the scheduling algorithms use metaheuristic-based mechanisms, 30% heuristic-based, 23% machine learning algorithms, and the other 9% deterministic methods. Energy consumption is the most significant parameter, addressed in most articles with a share of 19%. Finally, a number of important areas for improving task scheduling methods in FC in the future are presented
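    As a minimal illustration of the scheduling problem the survey covers, the sketch below greedily assigns deadline-sorted tasks to heterogeneous fog nodes and offloads to the cloud whenever no node can meet a deadline. All task lengths, node speeds, and deadlines are invented for illustration; real schedulers in the surveyed work also weigh cost, energy, and bandwidth:

    ```python
    # Toy deadline-aware greedy scheduler: tasks are taken in earliest-deadline
    # order; each goes to the fog node that would finish it soonest, or is
    # "offloaded to the cloud" when no node can meet its deadline.
    tasks = [                 # (name, length in MI, deadline in s) -- illustrative
        ("t1", 400, 2.0),
        ("t2", 900, 3.0),
        ("t3", 300, 1.0),
        ("t4", 1200, 4.0),
    ]
    nodes = {"fog1": 500.0, "fog2": 300.0}   # assumed MIPS of heterogeneous nodes
    ready = {n: 0.0 for n in nodes}          # time at which each node frees up

    schedule, offloaded = [], []
    for name, length, deadline in sorted(tasks, key=lambda t: t[2]):
        node = min(nodes, key=lambda n: ready[n] + length / nodes[n])
        finish = ready[node] + length / nodes[node]
        if finish <= deadline:
            ready[node] = finish
            schedule.append((name, node, finish))
        else:
            offloaded.append(name)           # cloud fallback
    ```
    
    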

    Cryptanalysis of two recent ultra-lightweight authentication protocols

    Radio Frequency Identification (RFID) technology is a critical part of many Internet of Things (IoT) systems, including Medical IoT (MIoT). On the other hand, the numerous limitations of IoT devices (such as memory space, computing capability, and battery capacity) make it difficult to implement cost- and energy-efficient security solutions. As a result, several researchers have attempted to address this problem, and several RFID-based security mechanisms for the MIoT and other constrained environments have been proposed. In this vein, Wang et al. and Shariq et al. recently proposed the CRUSAP and ESRAS ultra-lightweight authentication schemes. They demonstrated, both formally and informally, that their schemes meet the security properties required for RFID systems. In their proposed protocols they used very lightweight operations called Cro(·) and Rank(·), respectively. However, in this paper we show that those functions are not secure enough to provide the desired security. We show that Cro(·) is linear and reversible, and that it is easy to obtain the secret values used in its calculation. Then, by exploiting the vulnerability of the Cro(·) function, we demonstrate that CRUSAP is vulnerable to a secret disclosure attack. The proposed attack has a success probability of 1 and costs no more than a single run of the CRUSAP protocol. With the secret values of the tag and reader in hand, other attacks are obviously possible as well. In addition, we present a de-synchronization attack on the CRUSAP protocol. Furthermore, we provide a thorough examination of ESRAS and its Rank(·) function. We first present a de-synchronization attack that works for any desired Rank(·) function, including the Rank(·) function proposed by Shariq et al. We also show that Rank(·) does not provide the confusion and diffusion claimed by its designers. Finally, we conduct a secret disclosure attack against ESRAS
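    Why linearity and reversibility are fatal can be shown with a toy stand-in for Cro(·). Here it is modelled as bitwise XOR, which is NOT the actual CRUSAP definition, only a representative of the same class of weakness (linear over GF(2) and trivially invertible):

    ```python
    # Toy illustration: a linear, reversible combining function leaks secrets.
    def cro(x, y):
        """Stand-in for Cro(.): XOR is linear and self-inverse."""
        return x ^ y

    secret_key = 0b1011_0110_0101_0011
    public_nonce = 0b0101_1100_1110_0001     # sent in clear during the protocol

    message = cro(secret_key, public_nonce)  # value an eavesdropper observes

    # Attack: because cro is reversible and the nonce is public, applying
    # cro once more strips the nonce off and reveals the key.
    recovered_key = cro(message, public_nonce)
    ```

    A single passively observed protocol run suffices, which is the shape of the secret disclosure attack described above: invert the combining function, peel off the public values, and read out the secrets.
    
    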